13 research outputs found

    Profile control chart based on maximum entropy

    Monitoring a process over time is important in manufacturing to reduce wasted money and time. The purpose of this article is to monitor the profile coefficients rather than the process mean. Two methods are proposed for simultaneously monitoring the intercept and slope of a simple linear profile: the first is based on linear regression and the second on the maximum entropy principle. A simulation study compares the two methods in terms of the type II error and the average run length. Finally, two real examples are presented to demonstrate the performance of the proposed chart.
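    The sketch below illustrates the general idea of profile monitoring described in the abstract, not the paper's specific charts: least-squares estimates of the intercept and slope are checked against Shewhart-type limits at each sampling stage, and the average run length (ARL) is estimated by simulation. The design points, parameter values and 3-sigma limit width are illustrative assumptions.

```python
# A minimal sketch (not the paper's exact method): Shewhart-type monitoring of the
# intercept and slope of a simple linear profile, with ARL estimated by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)

x = np.array([2.0, 4.0, 6.0, 8.0])      # fixed design points of the profile (assumed)
beta0, beta1, sigma = 3.0, 2.0, 1.0     # assumed in-control intercept, slope, noise sd

n = len(x)
sxx = np.sum((x - x.mean()) ** 2)
var_b1 = sigma ** 2 / sxx                                # Var of the slope estimate
var_b0 = sigma ** 2 * (1.0 / n + x.mean() ** 2 / sxx)    # Var of the intercept estimate
L = 3.0                                                  # control-limit width (illustrative)

def fit_profile(y):
    """Least-squares estimates of (intercept, slope) for one sampled profile."""
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

def run_length(shift_b0=0.0, shift_b1=0.0, max_samples=100_000):
    """Number of profiles sampled until either estimate falls outside its limits."""
    for t in range(1, max_samples + 1):
        y = (beta0 + shift_b0) + (beta1 + shift_b1) * x + rng.normal(0, sigma, n)
        b0, b1 = fit_profile(y)
        if (abs(b0 - beta0) > L * np.sqrt(var_b0)
                or abs(b1 - beta1) > L * np.sqrt(var_b1)):
            return t
    return max_samples

# Estimate in-control and out-of-control ARL by simulation.
arl_ic = np.mean([run_length() for _ in range(2000)])
arl_oc = np.mean([run_length(shift_b1=0.2) for _ in range(2000)])
print(f"ARL (in control)      ~ {arl_ic:.1f}")
print(f"ARL (slope shift 0.2) ~ {arl_oc:.1f}")
```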

    New statistical control limits using maximum copula entropy

    Statistical quality control methods are essential for keeping production to standard in manufacturing processes, and many classical charts exist for controlling a process. Most of them rely on a global assumption about the distribution of the process data, usually normality, which is clearly not valid for every process, and control charts built on a wrong assumption lead to false decisions that waste funds. The main question when working with a multivariate data set is therefore how to find a multivariate distribution that preserves the original dependency between the variables. To the best of our knowledge, a copula function guarantees that the dependence structure is carried into the resulting joint distribution, but this is not enough when no other functional information about the population is available and only a data set is at hand; we therefore apply the maximum entropy concept to deal with this situation. In this paper we first find the joint distribution of a data set from a manufacturing process that must be controlled while the production process is running, then obtain an elliptical control limit via maximum copula entropy, and finally illustrate the stated method with a practical example. Average run lengths are calculated for several means and shifts to show the ability of the maximum copula entropy approach, and two real data examples are presented.
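    As a rough illustration of an elliptical control limit for multivariate data, the sketch below uses the sample mean and covariance with a chi-square boundary on the Mahalanobis distance; this is a standard construction, not the maximum-copula-entropy limit proposed in the paper, and all data and the false-alarm rate are illustrative assumptions.

```python
# A minimal sketch of an elliptical control region (not the paper's
# maximum-copula-entropy construction): flag points whose Mahalanobis distance
# from the Phase I mean exceeds a chi-square quantile.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Phase I: historical in-control bivariate data (illustrative, assumed stable).
phase1 = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.6], [0.6, 1.0]], size=200)
mu_hat = phase1.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(phase1, rowvar=False))

alpha = 0.0027                                            # nominal false-alarm rate (assumed)
limit = stats.chi2.ppf(1 - alpha, df=phase1.shape[1])     # elliptical boundary

def out_of_control(obs):
    """True if the observation lies outside the elliptical control region."""
    d = obs - mu_hat
    return d @ cov_inv @ d > limit

# Phase II: monitor new observations against the ellipse.
for obs in np.array([[10.2, 5.1], [13.5, 4.0]]):
    print(obs, "out of control" if out_of_control(obs) else "in control")
```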

    A note on interval estimation for the mean of inverse Gaussian distribution

    In this paper, we study interval estimation for the mean of the inverse Gaussian distribution, a member of the natural exponential families with cubic variance function. We also simulate the coverage probabilities of the confidence intervals considered. The results show that the likelihood ratio interval performs best and the Wald interval has the poorest performance.
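    The following sketch shows the kind of coverage simulation the abstract refers to, restricted to the Wald interval for the inverse Gaussian mean (the likelihood ratio interval requires the profile likelihood and is omitted here). The true parameters, sample size and number of replications are illustrative assumptions.

```python
# A minimal coverage simulation for the Wald interval of the inverse Gaussian mean.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

mu, lam = 2.0, 4.0                      # assumed true IG(mu, lambda) parameters
n, reps = 20, 10_000
z = stats.norm.ppf(0.975)               # 95% two-sided critical value

covered = 0
for _ in range(reps):
    x = rng.wald(mu, lam, size=n)                 # inverse Gaussian sample
    mu_hat = x.mean()                             # MLE of the mean
    lam_hat = n / np.sum(1.0 / x - 1.0 / mu_hat)  # MLE of lambda
    se = np.sqrt(mu_hat ** 3 / (n * lam_hat))     # Var(mu_hat) ~ mu^3 / (n * lambda)
    covered += (mu_hat - z * se <= mu <= mu_hat + z * se)

print(f"Wald interval coverage ~ {covered / reps:.3f} (nominal 0.95)")
```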

    Discrete (Dynamic) Cumulative Residual Entropy in Bivariate case

    Cumulative residual entropy (CRE) is a measure of uncertainty for continuous distributions introduced by Rao et al. [27], and its discrete version was defined by Baratpour and Bami [4]. The present paper extends the definition of CRE and its dynamic version to the bivariate discrete setup and studies its properties. We show that the proposed measure is invariant under increasing one-to-one transformations and has an additive property. A lower bound for the discrete bivariate CRE based on Shannon entropy is also obtained. Furthermore, we introduce scalar and vector bivariate dynamic CRE and their connections with well-known reliability measures such as the discrete bivariate mean residual life time. Finally, the bivariate versions of the hazard rate, mean residual life and cumulative residual entropy are obtained for the bivariate geometric distribution.
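    For orientation, the sketch below evaluates the univariate discrete CRE that the paper builds on, CRE(X) = -sum_k P(X > k) log P(X > k) for a nonnegative integer-valued X, numerically for a geometric distribution by truncating the tail. The exact summation convention and the bivariate extension are not reproduced here; this is an assumption-based illustration only.

```python
# A minimal sketch of the univariate discrete cumulative residual entropy,
# computed from survival probabilities P(X > k) and illustrated for a
# geometric distribution (truncated tail; parameters are illustrative).
import numpy as np

def discrete_cre(survival):
    """CRE from survival probabilities P(X > k), k = 0, 1, 2, ..."""
    s = np.asarray(survival, dtype=float)
    s = s[s > 0]                        # terms with P(X > k) = 0 contribute nothing
    return -np.sum(s * np.log(s))

# Geometric distribution on {1, 2, ...} with success probability p: P(X > k) = (1 - p)**k.
p, kmax = 0.3, 200
survival = (1 - p) ** np.arange(kmax + 1)    # P(X > 0) = 1 contributes log 1 = 0
print(f"Discrete CRE of Geometric(p={p}) ~ {discrete_cre(survival):.4f}")
```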

    A view on Bhattacharyya bounds for inverse Gaussian distributions

    Keywords: Bhattacharyya matrix, Bhattacharyya bound, Inverse Gaussian distribution, Failure rate, Coefficient of variation, Mode, Moment generating function, Natural exponential family